Lessons Learned from an Experiment in Crowdsourcing Complex Citizen Engineering Tasks with Amazon Mechanical Turk

Authors

  • Matthew Staffelbach
  • Peter Sempolinski
  • David Hachen
  • Ahsan Kareem
  • Tracy Kijewski-Correa
  • Douglas Thain
  • Daniel Wei
  • Gregory R. Madey
Abstract

America’s aging infrastructure is failing to keep pace with its growing population. The average grade in the ASCE (American Society of Civil Engineers) 2013 report card for America’s infrastructure was a D+, with an estimated $3.6 trillion investment needed by 2020 and inspection and assessment needs that far surpass the available manpower. Crowdsourcing is increasingly seen as one potentially powerful way of expanding the supply of labor for problem-solving tasks, but there are a number of concerns about the quality of the resulting data or analysis. This concern is especially significant for civil infrastructure, for an obvious reason: flawed data could lead to loss of life. Our goal was to determine whether workers on Mechanical Turk were capable of developing basic engineering analysis skills using only the training afforded by comprehensive tutorials and guided questionnaires.

Related Articles

Lessons Learned from Crowdsourcing Complex Engineering Tasks

Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and image tagging. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: analysis and creation of wind simu...

Creating a Bi-lingual Entailment Corpus through Translations with Mechanical Turk: $100 for a 10-day Rush

This paper reports on experiments in the creation of a bi-lingual Textual Entailment corpus using a non-expert workforce under strict cost and time limitations ($100, 10 days). To this aim, workers were hired for translation and validation tasks through the CrowdFlower channel to Amazon Mechanical Turk. As a result, an accurate and reliable corpus of 426 English/Spanish entailment pairs h...

Crowd ideation of supervised learning problems

Crowdsourcing is an important avenue for collecting machine learning data, but crowdsourcing can go beyond simple data collection by employing the creativity and wisdom of crowd workers. Yet crowd participants are unlikely to be experts in statistics or predictive modeling, and it is not clear how well non-experts can contribute creatively to the process of machine learning. Here we study an en...

"The Whole Is Greater Than the Sum of Its Parts": Optimization in Collaborative Crowdsourcing

In this work, we initiate the investigation of optimization opportunities in collaborative crowdsourcing. Many popular applications, such as collaborative document editing, sentence translation, or citizen science, resort to this special form of human-based computing, where crowd workers with appropriate skills and expertise are required to form groups to solve complex tasks. Central to any col...

Crowdsourcing Document Relevance Assessment with Mechanical Turk

We investigate human factors involved in designing effective Human Intelligence Tasks (HITs) for Amazon’s Mechanical Turk. In particular, we assess document relevance to search queries via MTurk in order to evaluate search engine accuracy. Our study varies four human factors and measures resulting experimental outcomes of cost, time, and accuracy of the assessments. While results are largely in...

Journal:
  • CoRR

Volume: abs/1406.7588

Published: 2014